Improving the Speed of Support Vector Regression Using Regularized Least Square Regression
Authors
Abstract
Similar resources
Learning Rates of Least-Square Regularized Regression
This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and the capacity of the reproducing kernel Hilbert...
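The regularized least-square scheme described in this abstract amounts to kernel ridge regression: minimize squared error plus an RKHS norm penalty, which reduces to a single linear solve against the kernel matrix. A minimal sketch follows; the Gaussian kernel, its width, and the regularization constant are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, width=0.2):
    """Gaussian (RBF) kernel matrix between row-sample arrays X and Y."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * width ** 2))

def fit_regularized_ls(X, y, lam=1e-6, width=0.2):
    """Regularized least squares in the RKHS: solve (K + lam*n*I) alpha = y."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, width)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(alpha, X_train, X_new, width=0.2):
    """Evaluate the fitted function f(x) = sum_i alpha_i * k(x_i, x)."""
    return gaussian_kernel(X_new, X_train, width) @ alpha
```

With a small `lam` the fit nearly interpolates smooth targets; larger `lam` trades training error for a smaller RKHS norm, which is exactly the knob the learning-rate analysis studies.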
Robust Least Square Support Vector Regression for Contaminated Data Modeling
Weighted least squares support vector machine (WLSSVM) is a robust version of the least squares support vector machine (LS-SVM). It adds weights to the error variables to eliminate the influence of outliers. But the weights, which largely depend on the original regression errors from the unweighted LS-SVM, might be unreliable for correcting the biased estimation of LS-SVM, especially for the training data ...
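The two-pass reweighting idea this abstract builds on can be illustrated with a simplified, bias-free LS-SVM: fit once without weights, standardize the resulting error variables with a robust (MAD-based) scale, downweight samples with large standardized errors, and re-solve. The tapering thresholds `c1`/`c2` and the kernel are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def robust_weights(e, c1=2.5, c2=3.0):
    """Downweight samples whose standardized error is large (illustrative
    Suykens-style taper: weight 1 below c1, linear taper to ~0 above c2)."""
    s = 1.483 * np.median(np.abs(e - np.median(e)))  # robust scale via MAD
    s = s if s > 0 else 1e-12                        # guard a degenerate scale
    r = np.abs(e) / s
    v = np.ones_like(r)
    taper = (r > c1) & (r <= c2)
    v[taper] = (c2 - r[taper]) / (c2 - c1)
    v[r > c2] = 1e-8
    return v

def weighted_lssvm(K, y, gamma=10.0):
    """Two-pass weighted LS-SVM on kernel matrix K (bias term omitted)."""
    n = y.size
    alpha = np.linalg.solve(K + np.eye(n) / gamma, y)  # pass 1: unweighted
    e = alpha / gamma            # LS-SVM error variables: e_i = alpha_i / gamma
    v = robust_weights(e)
    alpha_w = np.linalg.solve(K + np.diag(1.0 / (gamma * v)), y)  # pass 2
    return alpha_w, v
```

The known weakness the abstract points at is visible here: the weights in pass 2 are computed entirely from the pass-1 residuals, so if the unweighted fit is already badly biased by outliers, the weights inherit that bias.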
Square Penalty Support Vector Regression
Support Vector Regression (SVR) is usually pursued using the ε-insensitive loss function while, alternatively, the initial regression problem can be reduced to a properly defined classification one. In either case, slack variables have to be introduced in practically interesting problems, the usual choice being the consideration of linear penalties for them. In this work we shall discuss the solu...
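The two slack-penalty choices contrasted in this abstract differ only in how the ε-insensitive deviation enters the objective: linearly (standard SVR) or squared. A minimal sketch of the per-sample loss terms, with placeholder values for ε and C:

```python
import numpy as np

def eps_insensitive(r, eps=0.1):
    """Epsilon-insensitive deviation: zero inside the eps-tube, |r| - eps outside."""
    return np.maximum(0.0, np.abs(r) - eps)

def linear_slack_penalty(r, eps=0.1, C=1.0):
    """Standard SVR objective term: C * xi, linear in the slack variable."""
    return C * eps_insensitive(r, eps)

def square_slack_penalty(r, eps=0.1, C=1.0):
    """Square-penalty variant: C * xi**2, which smooths the objective."""
    return C * eps_insensitive(r, eps) ** 2
```

The square penalty grows faster for large violations but is differentiable at the tube boundary, which changes the structure of the resulting optimization problem.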
Error analysis of regularized least-square regression with Fredholm kernel
Learning with Fredholm kernel has attracted increasing attention recently since it can effectively utilize the data information to improve the prediction performance. Despite rapid progress on theoretical and experimental evaluations, its generalization analysis has not been explored in learning theory literature. In this paper, we establish the generalization bound of least square regularized ...
Application of integral operator for regularized least-square regression
In this paper, we study the consistency of regularized least-square regression in general reproducing kernel Hilbert spaces. We characterize the compactness of the inclusion map from a reproducing kernel Hilbert space to the space of continuous functions and show that the capacity-based analysis by uniform covering numbers may fail in a very general setting. We prove the consistency an...
Journal
Journal title: Ingénierie des systèmes d'information
Year: 2020
ISSN: 1633-1311, 2116-7125
DOI: 10.18280/isi.250404